Designing by Training: Acceleration Neural Network for Fast High-Dimensional Convolution

Neural Information Processing Systems

High-dimensional convolution is widely used across disciplines but suffers from a serious performance problem due to its high computational complexity. For decades, fast algorithms for the Gaussian convolution were designed by hand. Recently, demand for various non-Gaussian convolutions has emerged and continues to grow. However, the handmade acceleration approach is no longer feasible for so many different convolutions, since it is a time-consuming and painstaking job. Instead, we propose an Acceleration Network (AccNet), which turns the work of designing new fast algorithms into training the AccNet. This is done by: 1. interpreting the splatting, blurring and slicing operations as convolutions; 2. turning these convolutions into $g$CP layers to build AccNet. After training, the activation function $g$ together with the AccNet weights automatically defines the new splatting, blurring and slicing operations. Experiments demonstrate that AccNet is able to design acceleration algorithms for a wide range of convolutions, including Gaussian and non-Gaussian convolutions, and produces state-of-the-art results.


Reviews: Designing by Training: Acceleration Neural Network for Fast High-Dimensional Convolution

Neural Information Processing Systems

I am happy with the responses the authors have provided to my initial concerns, which have improved the manuscript. I would encourage the authors to add an appendix if they believe it would let them convey a more complete message without having to draft another, longer manuscript. They use the new Acceleration Network (AccNet) to express the approximation function of splatting/blurring/slicing (SBS) and generalise SBS by changing the architecture of AccNet. Once training finishes, the proposed fast convolution algorithm can be derived from the weights and activation functions of each layer. The various experiments they conducted demonstrate the effectiveness of the proposed algorithm.


Designing by Training: Acceleration Neural Network for Fast High-Dimensional Convolution

Dai, Longquan, Tang, Liang, Xie, Yuan, Tang, Jinhui

Neural Information Processing Systems
